Conversation
Adds bitdeer-ai provider with OpenAI-compatible inference routing, environment-based credential discovery, CLI integration, and docs. Made-with: Cursor
Thank you for your submission! We ask that you sign our Developer Certificate of Origin (DCO) before we can accept your contribution. You can sign the DCO by adding a comment below using this text:

> I have read the DCO document and I hereby sign the DCO.

You can retrigger this bot by commenting `recheck` in this Pull Request.

*Posted by the DCO Assistant Lite bot.*
Thank you for your interest in contributing to OpenShell, @ruijietey. This project uses a vouch system for first-time contributors. Before submitting a pull request, you need to be vouched by a maintainer. To get vouched:
See CONTRIBUTING.md for details.
You can configure Bitdeer through the existing generic OpenAI-compatible provider configuration. That would be the preferred way, so we do not have to maintain multiple providers for the same OpenAI protocol.
## Summary

Add Bitdeer AI (`bitdeer-ai`) as a first-class inference provider, enabling OpenAI-compatible inference routing through the Bitdeer AI API with environment-based credential discovery via `BITDEERAI_API_KEY`.

## Related Issue

## Changes

- Add the `BitdeerAiProvider` plugin in `openshell-providers` with `BITDEERAI_API_KEY` env-var discovery
- Add the `BITDEER_AI_PROFILE` inference profile (OpenAI-compatible, Bearer auth, default base URL `https://api-inference.bitdeer.ai/v1`)
- Register `bitdeer-ai` in `profile_for()`, `ProviderRegistry`, and `normalize_provider_type()` (accepts both `bitdeer-ai` and `bitdeer`)
- Add a `BitdeerAi` variant to the `CliProviderType` enum in the CLI
- Document `bitdeer-ai` in the provider reference table (`docs/sandboxes/manage-providers.md`)

## Testing

- `mise run pre-commit` passes
- `bitdeer_ai_profile_has_correct_defaults`: verifies profile type, base URL, credential key, and auth header
- `discovers_bitdeer_ai_env_credentials`: verifies env-var-based credential discovery
- `normalize_provider_type` test extended to cover the `bitdeer-ai` and `bitdeer` aliases
- `profile_for` test extended to include a `bitdeer-ai` lookup

## Checklist
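For illustration, the provider wiring described in the Changes section might be sketched roughly as follows. This is a hedged sketch, not the project's actual code: the `InferenceProfile` struct, its field names, and the function signatures are assumptions; only the default values, the env-var name, and the accepted aliases come from the PR description.

```rust
use std::env;

// Hypothetical sketch of the bitdeer-ai wiring described in this PR.
// Types and signatures are assumed; the literal values mirror the PR text.

struct InferenceProfile {
    provider_type: &'static str,  // wire protocol spoken by the endpoint
    base_url: &'static str,       // default API endpoint
    credential_key: &'static str, // env var holding the API key
    auth_scheme: &'static str,    // HTTP auth scheme used with the key
}

// Default profile, matching the values listed under Changes.
const BITDEER_AI_PROFILE: InferenceProfile = InferenceProfile {
    provider_type: "openai-compatible",
    base_url: "https://api-inference.bitdeer.ai/v1",
    credential_key: "BITDEERAI_API_KEY",
    auth_scheme: "Bearer",
};

// Accept both the canonical `bitdeer-ai` name and the `bitdeer` alias.
fn normalize_provider_type(raw: &str) -> Option<&'static str> {
    match raw {
        "bitdeer-ai" | "bitdeer" => Some("bitdeer-ai"),
        _ => None,
    }
}

// Env-based credential discovery: a credential is found only when the
// variable is set and non-empty.
fn discover_bitdeer_credentials() -> Option<String> {
    env::var(BITDEER_AI_PROFILE.credential_key)
        .ok()
        .filter(|key| !key.is_empty())
}

fn main() {
    assert_eq!(normalize_provider_type("bitdeer"), Some("bitdeer-ai"));
    assert_eq!(normalize_provider_type("anthropic"), None);
    assert_eq!(BITDEER_AI_PROFILE.provider_type, "openai-compatible");
    assert_eq!(BITDEER_AI_PROFILE.auth_scheme, "Bearer");
    // Yields Some only when BITDEERAI_API_KEY is set in the environment.
    let _maybe_key = discover_bitdeer_credentials();
    println!("ok");
}
```

In the real crate, `normalize_provider_type()` presumably handles many more provider names; the sketch isolates just the `bitdeer-ai` case to show the alias handling that the extended tests cover.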